Friday, August 31, 2007

Guest posts on The Google Watchdog

Do you have a complaint about Google? Want to write about some of your experiences with Google that have been positive, aggravating, or eye-opening? I'm going to give 3 webmasters the chance to write their own guest post on The Google Watchdog in the next week.

If you are interested, you can contact me through my web design site or post a comment here. The rules are simple: make your post informative without being spammy. This may be a DOFOLLOW blog, but more than 2 outbound links is too many.

This could be a great chance for some of you to vent your frustrations with Google or to promote your new blog.

Thursday, August 30, 2007

A fantastic SERP tool for webmasters

I've said it before, but I'll reiterate: Pagerank is nothing compared to your SERP ranking. What's the point of having a high Pagerank if your site doesn't rank well for any keywords?

One of the issues I ran into when starting to learn SEO was finding where I ranked for specific keywords. I tried IBP, but it was slow, and I'd heard some things about it (which I won't repeat here) that warned me away. I also tried some online tools that were either too clunky to be useful or required me to jump through a lot of hoops.

The tool I ended up using is the Search Engine Ranking tool at Ranking Check. It lets me see the top 100 SERPs on all three major search engines. (Some webmasters complain that they want a tool showing the top 1000 SERPs, but any ranking past the top 30 is practically useless anyway, so the top 100 is fine with me.) Ranking Check also has a free tool to track 10 keywords on your sites, and you can pay for additional keyword tracking. The tool also emails me when my ranking has changed. The only downside I found is that ranking-change emails only go out once per week; a daily email would be nice.

Overall, I rate the tool an 8 out of 10, and would give it an even higher score if the new ranking emails were sent out on a more frequent basis.

(* note - I wasn't paid, coerced, or otherwise compensated in any way for this post. I don't know who created it, but I love the tool.)

Wednesday, August 29, 2007

A cool new PR tool I found

When browsing (as a user), I often use Pagerank to gauge the "authority" level of a site. Maybe I'm brainwashed, but I put a lot of stock in the Pagerank number and trust Google to help me decide which sites are good and which are bad.

But there are quite a few insidious webmasters out there who fake their Pagerank through various methods. I found a tool that can let you know if a site's Pagerank is fake: check out the Fake Pagerank tool. Although quite a bit of its text is in French, I've been able to use it without a problem. The author himself posted about the tool over at Digital Point, and says that it checks "Http header + fake Http header + false TLD detector + 3 level of cheating _ fake redirection detector".

I'll be posting a list of some of my favorite webmaster tools in the upcoming days.

My on-page keyword optimization experiment

I recently decided to test some on-page optimization techniques to see how they affected my SERPs for an EXTREMELY competitive keyword (over 350,000 searches per month). A search for this particular keyword returns over a billion results (1.1 billion, to be more exact). We'll call this keyword "foo mar shopping".

Before the experiment started, I was ranking about 600 in the world for "foo mar shopping". This number fluctuated between about 400 and 1000, but for the most part had settled into the 600 range. Not too bad for a competitive keyword that I hadn't optimized for at all.

Here's what I did on-page to optimize for the keyword. Most webmasters will recognize these techniques - they're strictly newbie-level things.

  • Added "foo mar shopping" to the front of my title tag.
  • Rewrote my description META tag so that it included the keyword twice.
  • Rewrote my keywords META tag so that it contained 4 variations of the keyword, with the exact keyword first in the list.
  • Added the keyword as the ALT text on every image on the page.
  • Changed my style sheet so that H1 text showed up as only slightly larger than the other text, and bolded. I then added two separate column headers with the keyword using the H1 tag.
  • I peppered my content with the keyword in a natural way. The keyword took up approximately 5% of my content.
  • Where appropriate, I bolded, italicized, or underlined the keyword.
  • All outbound links on the page had the "title" attribute added to the A tag with the title text being the exact keyword.
  • I created an attractive bulleted list that was useful to the user and also included several variations of the keyword in the list items.
  • On every page of my website, I linked back to the target page using the keyword as the anchor text.
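To make the list above concrete, here's a simplified sketch of what the optimized page's markup might look like. The page content, image, and link are placeholders, not my actual site:

```html
<!-- Sketch only: keyword in title, description, keywords META, H1, ALT text,
     bolded body text, and link title attributes, as described above -->
<head>
  <title>Foo Mar Shopping - My Example Site</title>
  <meta name="description" content="A foo mar shopping guide: compare foo mar shopping deals and prices.">
  <meta name="keywords" content="foo mar shopping, foo mar shop, shopping for foo mar, foo mar deals">
</head>
<body>
  <h1>Foo Mar Shopping Deals</h1>
  <p>Find the best <b>foo mar shopping</b> bargains before anyone else does.</p>
  <img src="deals.jpg" alt="foo mar shopping">
  <a href="http://www.example.com/" title="foo mar shopping">A related site</a>
</body>
```

Note how the keyword appears everywhere a search engine looks, but the page still reads naturally to a human visitor.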
So, I pulled some pretty uninventive tricks out of my bag. These are things that SEOers have been doing for years. My experiment was to see where my SERPs went for the keyword; I checked the results about every 3 days. Here are the results:
  1. Day 3, first SERP check - ranking at about #500. Not significant because the normal SERP fluctuations had actually given me a higher ranking than this.
  2. Day 7, 2nd SERP check - here's where the largest jump occurred. My Google SERP for the keyword jumped up to 110.
  3. Day 9, 3rd SERP check - ranked #72 (yeah, broke that mystical "top 100" barrier!)
  4. Day 13, 4th SERP check - ranked #64
  5. Day 15, 5th SERP check - ranked #74 (dropped here for some reason) {shrug}
  6. Day 20, 6th SERP check - ranked #51
  7. Day 25, 7th SERP check - ranked #49
  8. Day 30, 8th SERP check - (this is today) ranked #39

The results themselves seem to suggest that correct on-page optimization can significantly help in raising the SERPs of an averagely ranked site. From #600 to #39 in one month is a pretty good result.

Just an FYI: this page has a Pagerank of only 1, and I believe that with a natural backlinking campaign it can reach the top ten for a keyword that is dominated by high-Pagerank websites.

Tuesday, August 28, 2007

I've joined the DOFOLLOW movement

Just a quick note to let Google know how much I HATE the rel='NOFOLLOW' attribute added to links. It's especially bothersome that Google has made Blogger a NOFOLLOW blog by default. In an earlier post, I gave instructions on how to change your blog to a DOFOLLOW site.

If I link to another site, I'm "voting" for them just like Google wants. They have something of interest to me, or content that might be useful to other readers. If Google wants to enforce the NOFOLLOW on paid links, so be it, but leave my damn blog code alone!

Top 10 things to avoid during SEO

I asked for some advice from my friends over at DigitalPoint on the worst things to do when performing SEO. I've also added 1 or 2 of my own.

  1. Don't link to or from bad sites
  2. Build your links slowly
  3. Don't "keyword stuff"
  4. Don't start your SEO without a plan - research your keywords, the market, and your competition
  5. Avoid automatic linking software (this is a good way to get your site banned quickly)
  6. Don't build your site for search engines. Instead, build it for your users. A website that is illegible because of all your on-page optimization will be useless to your users.
  7. Don't build a website entirely in Flash - even Flash intros should be avoided.
  8. Stay away from link pages. If you have a page on your site dedicated to linking, remove it and add the links back to relevant parts of your website.
  9. Don't hire incompetent SEO firms - instead, do the research yourself and get trained in real SEO techniques. No one cares about your site more than you.
  10. Any scheme that guarantees you a top rank is guaranteed to get you banned. Stay away from them!
  11. Don't believe everything you hear on the internet. Find what works for you. There is lots of bad advice out there.
  12. Don't get involved in web-rings or link farms.
Okay, so my top ten became a top twelve. Thanks to jodfran over at DP for his great suggestions. Leave any comments if you have other techniques to avoid.

If you disagree with me, leave a comment (on my DOFOLLOW blog!)

Monday, August 27, 2007

Google sitemaps - why isn't there an "official" tool to create one?

I'm not a big fan of sitemaps. If I've built my linking structure correctly, the Googlebot should be able to find and index all my pages in a natural way. However, because I've been trained to jump through the Goops (Google hoops), I build a sitemap for many of my sites. I use an outside tool that sometimes does a good job and sometimes does not. It's a crapshoot that often ends with me doing more work than I should, for a process I'm not convinced actually provides any useful benefit.

Why doesn't Google have its own in-house "official" sitemap builder? An application written by Google engineers who understand the specification behind the sitemap and know the internal workings of the Googlebot that uses it to traverse websites. This tool could be incorporated into the webmaster tools: after verifying your site, you'd create the sitemap with a single click. It could also incorporate an HTML validator that shows errors upfront, before they hit the sitemap.

So why hasn't Google done this yet? It can't be a matter of dollars and cents - they have more money than God. Maybe it's a bean-counter thing, with the accountants asking "how does this make us any money?". Whatever the reason, it's time for Google to step up and either build its own sitemap creation tool or admit that sitemaps don't really mean all that much.
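The frustrating part is how simple the sitemap protocol actually is. Here's a minimal sketch, in Python, of what a bare-bones generator could do, assuming you already have a list of your page URLs (the URLs below are placeholders; a real tool would crawl the site to find them):

```python
# Minimal sketch of a sitemap generator following the sitemaps.org 0.9 protocol.
# Assumes the list of page URLs is already known; crawling is left out.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc in urls:
        # One <url> entry per page; <loc> is the only required child element.
        lines.append('  <url><loc>%s</loc></url>' % escape(loc))
    lines.append('</urlset>')
    return '\n'.join(lines)

print(build_sitemap(['http://www.example.com/',
                     'http://www.example.com/about.html']))
```

That's the whole format at its simplest - which makes it even stranger that Google leaves us to third-party tools for it.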

Google bans itself!

This is a funny story about Google getting a taste of its own medicine: Article

Essentially, Google marked the blog (owned by Google) as spam, notified the blog owners (presumably Google employees) of the spam designation (which they ignored), and deleted the blog after a certain amount of time. A quick-thinking user then squatted on the domain and posted some odd stuff.

The blog is back up and running now. Google acted very quickly to restore it and retrieve its domain name from the squatter. Wonder how quickly they'd have reacted had this been just some Joe Schmoe webmaster who made the same mistake? You can bet it wouldn't have been taken care of this fast.

Ha. :)

Sunday, August 26, 2007

How to remove the NOFOLLOW from your blog

I hate the fact that the Google-owned software defaults to a NOFOLLOW tag, essentially killing any link juice from being passed out of the blog. If I link to someone, there's a reason. They have received a natural, organic link (a 'vote') from me. I like what they have, and I have chosen to link to them. The idiotic NOFOLLOW tells the Googlebot that the link is somehow worthless and that it shouldn't follow it to index that page. STUPID!

Here's how to remove the rel='NOFOLLOW' from your Blogger blog. (note: if you use another blogging software, this method won't work - leave me a comment if you have links to other guides on how to remove the NOFOLLOW)

  1. Login to your Blogger blog
  2. Click the 'Customize' button from your blog's home page
  3. Select the 'Template' tab
  4. Click 'Edit HTML'
  5. On the Edit HTML page, make sure that the 'Expand Widget Templates' checkbox is selected
  6. Search for the text rel='nofollow' (in Firefox, click Edit -> Find in This Page, or press CTRL+F)
  7. Delete any instances of rel='nofollow' you find. Make sure that you only delete that exact text and none of the other characters around it
  8. Click 'Save Template'
  9. Voila, your blog should now be a fully compliant DOFOLLOW website
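For anyone unsure what they're looking for in step 7, here's a generic before/after of a comment-author link (your template's exact markup will differ - this is just an illustration):

```html
<!-- Before: the default comment link, with link juice blocked -->
<a href="http://example.com/" rel="nofollow">Commenter</a>

<!-- After: the same link with rel='nofollow' deleted; everything else stays -->
<a href="http://example.com/">Commenter</a>
```

Only the rel='nofollow' attribute goes; leave the href and everything else untouched.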
If you want to let your readers know that you allow comment links and that your blog is a DOFOLLOW site, you can use one of the logos here:

I Follow Logos

Google pagerank update is near

This is only a guess on my part, but a guess based on the things I see happening:

  1. I have noticed large fluctuations in the SERPs for several of the keywords on my web design site. Often, right before the update, you see these fluctuations as the Google engineers test their algorithm changes and make tweaks.
  2. Some webmasters are noticing a large drop in traffic from Google. While this happens a lot even outside of an update (especially to webmasters using "gray area" techniques), I've seen a large increase in reports of this phenomenon across the blogosphere and webmaster forums. It's another sign that the Google engineers are testing their nefarious new algo rules.
  3. I've seen a HUGE increase in the amount of trips the Googlebot is making to many of my sites. For example, one of my sites went from 4,000 Googlebot "hits" every 2 weeks to about 80,000 hits in a 5 day period. That's 16,000 per day, and the visits aren't slowing down yet. I would guess that they are rushing to index as much as possible before doing a full update.
  4. It's been almost 4 months since the last update. The longest gap between updates in the past has been 122 days, and we're getting very near that number.
  5. Matt Cutts has essentially been on vacation for the last several weeks. Is Google giving him some time off before he has to deal with the fallout that happens after every Google update? Since he's the public face of Google to webmasters, a nice vacation before the abuse begins seems like a practical thing.