4 posts tagged with searchengines.
Displaying 1 through 4 of 4.
Search and Destroy
I've been contemplating an Ask post for a while now, that deals with certain issues that I don't really want to appear in a search index... [more inside]
Yahoo vs Google
Why is it that Yahoo's index of MetaFilter is better than Google's? When the search box was switched over to Yahoo, no one really knew why. Any new ideas?
Google Search
'Your original search: beer bottle betadene was misspelled and returned 0 results. The corrected search: beer bottle betadine was done instead and the results appear below.' Has Google always done this (i.e., this just happens to be my first misspelled search to score zero hits), or is it something new?
To Index or Not?
The site is getting pummelled lately, so I ran stats on the past few days to see if there was a national news story or something. Of the 300k page views in the past four days, 100k, or 1/3 of the traffic, was due solely to the Googlebot.
It appears that having 13k threads filled with 200k comments of google-loving ascii is acting as some sort of honeypot, attracting the Google indexers like mad. Broken down by day, the Googlebot appears to visit over 25k pages at metafilter.com PER DAY. Going by browser/OS stats, the Googlebot visits MetaFilter more often than all Netscape clients combined, and it also exceeds all visits by people using Mac operating systems.
Although I'm impressed with the results (Google searches are the #1 referrer), is it worth basically bringing down the machine and keeping humans from being able to access it? If I were to include a robots exclusion file (sketched below) and block all search bots, would the net community be at a loss for not being able to find information discussed here?
I guess the big question is, does the utility of having the site indexed outweigh the problems the indexing causes?
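For anyone curious, the robots exclusion standard is just a plain-text file served at /robots.txt. A minimal sketch that asks every compliant crawler to stay out of the whole site is two lines (compliance is voluntary, but the Googlebot does honor it):

    User-agent: *
    Disallow: /

A less drastic option would be to turn away only Google's crawler while leaving other indexers alone:

    User-agent: Googlebot
    Disallow: /

Records are matched against the user-agent string a crawler announces, and "Disallow: /" covers every path on the site; an empty Disallow line would permit everything.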