A follow-up to My Letter to Google

I’ve been having a helpful exchange with Matt Cutts about the difficulty I’ve had getting my site, http://www.hystersisters.com, indexed in Google’s search engine. It all started because I posted a public letter to Google. I liken it to dragging rocks to the beach to spell out S.O.S. in the hopes of being rescued.

I asked Matt whether, when I received a penalty in spring of 2006 and submitted a request for re-inclusion, I was perhaps released to Google purgatory instead of receiving complete grace.

Matt’s reply (bless him!):

Kathy, I checked a while back and after leaving behind that particular link exchange, you got full grace. It would be helpful if you would do a follow-up post on your site to talk about what you’ve done with vbseo type stuff. My best guess is that in the process of trying to do whitehat SEO or streamlining urls, some change was made in the site architecture that blocks some pages in robots.txt or causes other issues. So a post that says “my urls used to look like bla. Then I made this change and now my urls look like foo.” might help.

My only frustration is not knowing exactly how to answer Matt’s questions, but I’m going to give it a good try, since I haven’t used vbSEO and my URLs haven’t changed (that I’m aware of, unless they changed from vB2 to vB3 as part of the normal upgrade).

First: robots.txt. As you can see, I block pages like member profiles, the member list, attachments, registration, search, pictures, and user control panel pages. Over 80,000 pages are blocked for these reasons, and we have over 127,000 member profiles blocked. The threads that are indexed have URLs like vb2/showthread or vb2/post; neither of these is blocked from the bots. I’ve checked the most recent threads with the “Test URLs against this robots.txt file” tool in Webmaster Tools, and they are all allowed according to the tool.

Site recent history and changes:

I upgraded the HysterSisters website from vB 2 to vB 3 during the summer of 2005. Up until then, and even after that point, I was well indexed in Google, Yahoo!, AOL, MSN, and the rest. In fact, HysterSisters was usually the #1 result for the keyword “hysterectomy” and all hysterectomy-related phrases.

Sometime during the winter of 2005/2006 (I believe, if I recall properly) I added DP co-op links to the footer of my site, and I received a penalty from Google in the spring of 2006. Before anyone sits horrified at the notion that I would do this: I innocently joined the link exchange, having been told it was fine and dandy. When I learned of the penalty, my first reaction was anger at Google. Then, when I realized what the ramifications were of being connected to the bad websites represented in my category, I was royally miffed, but not at Google. At myself, for not checking it out more thoroughly and discovering it was against “good webmaster practices.” At DP, for not disclosing the possibly bad sites.

I removed the bad link script. I submitted the reinclusion request to Google and received gracious replies from team Google.
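As a side note for other forum owners: the same “test URLs against this robots.txt file” check can be reproduced offline with Python’s standard urllib.robotparser module. The rules below are an illustrative sketch, not the site’s actual robots.txt file:

```python
import urllib.robotparser

# Illustrative rules: block member profiles, search, and registration
# pages, while leaving thread pages (showthread) unblocked.
rules = """
User-agent: *
Disallow: /vb2/member.php
Disallow: /vb2/memberlist.php
Disallow: /vb2/search.php
Disallow: /vb2/register.php
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Thread pages are not listed in any Disallow rule, so they are allowed:
print(parser.can_fetch("Googlebot", "/vb2/showthread.php?t=12345"))  # True
# Member profiles match a Disallow rule, so they are blocked:
print(parser.can_fetch("Googlebot", "/vb2/member.php?u=99"))  # False
```

This makes it easy to sanity-check a whole list of recent thread URLs in one go, rather than pasting them into the Webmaster Tools form one at a time.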
I thought all would be well.

The following summer, in 2006, I discovered that my pages in Google’s search results still did not reflect the number of pages on the site. Threads that were two months old were not showing up, but threads six years old were there.

Many of my comrades in the forum industry use vbSEO as a whitehat method of altering the thread URLs in their forums. Many others do not. I chose not to use vbSEO because, in the beginning, it was an encrypted product. I wouldn’t allow it to be installed on my server because we couldn’t verify that it wasn’t collecting data of some sort and sending it back to some “mother ship.” I’m paranoid that way. Evidence: my server has a firewall and added security measures. I comply with entities like the HONcode folks, credit card merchant compliance requirements, ScanAlert, and HackerSafe.

But lo and behold, Juan of vbSEO came out with a vbSEO sitemap for anyone using vBulletin. My web developer installed and configured it, and initially it seemed to help. My indexed pages grew to about 80K. Good deal. I was on a roll, heading toward indexing all 300,000 of our discussions. By the way, vbSEO is no longer encrypted as of a few weeks ago. I’m still not convinced I need to install vbSEO, since most of my forum-owner friends who do not use vbSEO have more pages indexed than those who do. Go figure. I think Juan has done a remarkable thing with the vbSEO tool; I’m just not convinced it’s required.

Fast forward another year to the summer of 2007. We changed the HysterSisters skin to make it less heavy on graphics. The skin we had been using since the 2005 upgrade was beautiful but heavy. A new skin would make pages load quicker.

Also in the summer of 2007, I signed up for ScanAlert (HackerSafe), which required a firewall on the server. More server configurations to comply.
With two servers for hystersisters.com, the connection between the two caused some DNS zone problems, which the system administrator fixed for us, making sure our DNS reporting was good.

I thought we would be doing well. Indexed pages were still between 30K and 80K. I had hopes of finding the magic egg that would get all the pages indexed. I had hoped the lighter skin would help the bot get through the pages faster. If not the bot, it sure helped our visitors!

And then, October 2007: I woke up to discover another leak in the numbers. We were hovering between 12K and 13K indexed pages in Google. Digging around, I found that the vbSEO sitemap was no longer generating the sitemap to Google’s bot standards. I removed it from Webmaster Tools and have tried to find an alternative. Yes, vbSEO upgraded their version of the sitemap, but it doesn’t support my older version of vBulletin. Yes, indeed, I am using an older version of vBulletin, because of the extensive customizations made to the software for the benefit of my members.

We do cool stuff with members’ hysterectomy dates: sending automated emails along their timeline to provide additional information to discuss with their doctors, checklists of great info, and reminders to “pamper the princess!” Yes, they sign up for “hysterectomy checkpoints.” Sorry, I digress. It’s great stuff for our members, but it means that upgrading the software is a challenge. The last upgrade, in 2005, took three months of re-coding to add back all of our features.

So, Matt, my URLs never changed structure, and neither did my site. I’ve closed the archive of my forums to send all visitors from the archive to the real thread discussion. I’ve tried to find a replacement for my sitemap. I’ve set the “crawl site at a faster rate” option at google.com/webmasters/tools, as suggested. I’ve set it back to normal. I’ve set it faster again. I’ve asked my system administrator to check the servers to make sure there is no setting configured to block out the bot.
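For anyone else whose sitemap generator broke, one stopgap is to build the sitemap file with a short script. The sketch below is in Python; the `build_sitemap` helper and the thread URLs are hypothetical examples, not the site’s real data:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap string following the sitemaps.org
    protocol: a <urlset> containing one <url><loc> entry per page."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Hypothetical thread URLs; a real script would pull these from the
# vBulletin database instead.
threads = [
    "http://www.hystersisters.com/vb2/showthread.php?t=100",
    "http://www.hystersisters.com/vb2/showthread.php?t=101",
]
print(build_sitemap(threads))
```

A full replacement would also have to respect the sitemaps.org limit of 50,000 URLs per file, splitting a 300,000-thread forum into several sitemap files tied together by a sitemap index.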
According to Webmaster Tools, the bot does visit. It just doesn’t stick around for long!

I’m at a loss to know what to do or how to correct this dilemma. HysterSisters is considered an authority site within its niche, and I hope to find a way for our pages to be indexed so that the women who need us can find us.

Thanks for listening, Google! And Matt, for your interest in my case, I call you my hero… especially if you can put me on the path to full indexing.


  1. Kathy, thanks so much for undertaking this so professionally with Matt Cutts. Very sorry for your indexing problem. Scary!

    But your communication will likely help many, many other webmasters with the information coming out of the dialogue.

    (Following closely and waiting for “solution day” to arrive. All best!!)

  2. Your forum page has a lot of errors on the Validator:


    One of these may be snagging Googlebot and causing him to head to the hills.

    Also, I don’t like your robots.txt formatting; have a look here for an example of a really great vB robots.txt:


    It would be worth correcting these issues.

    Also, I would seriously think about adding /showpost.php to your robots.txt.

    This is the individual post view, and it is really duplicate content of what’s in your thread. If every thread page has 20 posts, that means you have another 20 URLs of that content: one new showpost.php page for every reply.

    It’s far better not to waste link equity across 20 pages, but rather to have 20 times the authority passed to your showthread.php page.

    Also, see what landing on one of these showpost.php URLs from the SERPs is like:


    Not very good at getting visitors into your board, is it? Most people don’t even know there’s a link to the full version, so they read it and hit the back button.

    You have 2.3 million of these URLs chewing up link authority.

    Also, I would seriously consider blocking calendar.php.

    There are thousands of pages there that are all near-duplicates, chewing up link equity/authority.

    Does your calendar serve up content you want to rank for, and that you want first-time visitors from the SERPs to land on? The answer is probably not.

    The idea is not to have the biggest number of pages in Google, but to serve up the single best version of each content page and have all your link authority flowing to those pages. As it is now, it’s being spread needlessly across probably 3 million other URLs.
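    A hypothetical robots.txt fragment applying both suggestions (blocking individual-post and calendar URLs) might look like this; the /vb2/ prefix is taken from the thread URLs mentioned earlier and should be adjusted to the board’s real script paths:

    ```
    User-agent: *
    Disallow: /vb2/showpost.php
    Disallow: /vb2/calendar.php
    ```

    Because robots.txt rules match by path prefix, these two lines cover every query-string variant of those scripts as well.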

    Also, Google has a “crawl budget,” where it assigns a crawl of “x” number of pages per day. Do you want Google spending its x pages for the next 3 days on your calendar, or on your new posts and informative content?

    Your problem is easily fixable. I’d like to write more, but I must head out shopping; this should give you something to go on.

Comments are closed.