Posted on December 7, 2009 by Katherine
Fascinating article by Rand from SEOmoz about how Google potentially caps the number of URLs it will index from your site.
This is a particular problem for my team at EWG, for a few reasons:
- This organization has been online since the beginning of the Internet, and old reports are still live. Walking staff through which pages should stay live and which could be retired can be a time-consuming process.
- Until I came on board, they didn’t have an online marketing team, so the robots.txt file wasn’t being used (and we are still working on it). That means that for our search-intensive features, Googlebot will try to execute the searches and follow the results, wasting crawl time on pages we may not want indexed (see the first snippet after this list).
- I’ve spent a bit of time educating staff about being conservative about putting up pages that we want visitors to read but don’t necessarily want indexed, like our references pages (the second snippet after this list shows the standard fix).
- And the fine-tuning you need to do for WordPress blogs is intense. There’s a great video here that walks you through some of that; the third snippet after this list covers a common starting point.
- Underlying all of this is the challenge of focusing an organization on where it wants to be found online. EWG has an idea now, but many nonprofits don’t. Without clear keyword targets, it’s hard to streamline online content to work for the organization instead of wasting Googlebot’s time.
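To keep Googlebot from burning crawl time on our search features, a robots.txt rule along these lines would help. The paths here are hypothetical placeholders, not EWG’s actual URL patterns:

    User-agent: *
    # Hypothetical paths: block internal search-result pages so the
    # bot doesn't execute searches and crawl every result
    Disallow: /search/
    Disallow: /*?q=

One caveat: the * wildcard in that second Disallow is a Google extension, not part of the original robots.txt standard, so other crawlers may ignore it.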
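For the pages we want visitors to read but not indexed, the standard mechanism is a robots meta tag in each page’s head:

    <meta name="robots" content="noindex, follow">

The noindex keeps the page out of the index, while follow lets the bot still pass along the links on it, which suits a references page.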
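And for the WordPress fine-tuning, one common starting point (a sketch of the usual advice, not necessarily what the video prescribes) is blocking the admin and duplicate-content paths:

    User-agent: *
    # Keep crawlers out of WordPress internals
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    # Trackback and feed URLs just duplicate the posts themselves
    Disallow: /trackback/
    Disallow: /feed/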
If you haven’t read the article, I would. And if you have any tools for managing this kind of challenge on a site with a HUGE number of pages, let me know.
Filed under: future of search, Google, SEO | Tagged: Google, google indexing cap, google's index, robot.txt, SEO
Posted on December 3, 2008 by Katherine
Thought this was a brilliant analogy for the difference between SEO and PPC, and it illustrates why I’m an SEO advocate:
PPC = renting the land
SEO = owning it
Filed under: SEM, SEO | Tagged: difference between Seo and ppc, PPC, SEO