Subdomains not working? PoliticsFilter, FlashFilter, ArtFilter redirect me to the front page. March 19, 2006 4:26 AM

Subdomains not working? PoliticsFilter, FlashFilter, ArtFilter redirect me to the front page.
posted by nims to Bugs at 4:26 AM (36 comments total)



There used to be. When did this handy functionality disappear?
posted by nims at 5:10 AM on March 19, 2006




Yes, there used to be. And it was sweet.
posted by fire&wings at 5:28 AM on March 19, 2006


Well... I forgot about that. Now I am intrigued.
posted by dash_slot- at 5:29 AM on March 19, 2006


It worked the other day when I accidentally punched in www.ask.metafilter.com.
posted by dobbs at 6:53 AM on March 19, 2006


spaghetticodefilter.
posted by quonsar at 7:22 AM on March 19, 2006



The dog did it.
posted by caddis at 7:34 AM on March 19, 2006


I turned it off. It was fun but no one seemed to really use it and I only got feedback on how it should change. While trying to chase down some server errors the other day, I figured I might as well try rolling back that feature to see if it alleviated the load on the server.
posted by mathowie (staff) at 7:37 AM on March 19, 2006


I was just trying to use it earlier today, and was wondering why it wasn't working. If it's not causing problems, it would be appreciated if it came back.
posted by duende at 8:01 AM on March 19, 2006


It was classy. I guess I should just update my bookmarks. Thanks for the quick response.
posted by nims at 8:05 AM on March 19, 2006


I wasn't a big fan of it. Tags are too imperfect to use as the basis for subdomains.
posted by danb at 8:36 AM on March 19, 2006


Also, didn't it cause problems with search engines? I seem to remember that Yahoo ended up cataloging every subdomain of the site so that each page appeared multiple times.
posted by blag at 9:09 AM on March 19, 2006


...no one seemed to really use it...
...try rolling back that feature to see if it alleviated the load on the server...


*head asplodes*
posted by quonsar at 9:10 AM on March 19, 2006


Also, didn't it cause problems with search engines? I seem to remember that Yahoo ended up cataloging every subdomain of the site so that each page appeared multiple times.

Nope, I was telling robots to ignore all non-www domains.

*head asplodes*

I had to make one big change on the server to make the feature work, which had some bad side effects. It appears that turning off the tag front page stuff has alleviated the ripple effects, so while not too many people were using it, there was a strain from some unintended consequences.
posted by mathowie (staff) at 9:34 AM on March 19, 2006


What about doing something like www.metafilter.com/tags/art and www.metafilter.com/tags/politics? It's essentially the same thing without the subdomains. Then again, I don't know what the strain from the unintended consequences was, so...
posted by RustyBrooks at 9:40 AM on March 19, 2006


What about doing something like www.metafilter.com/tags/art

Buh?
posted by Gator at 10:05 AM on March 19, 2006


Neat idea RustyBrooks!!1
posted by odinsdream at 10:44 AM on March 19, 2006


My idea was so good, it went back in time.
posted by RustyBrooks at 12:11 PM on March 19, 2006


I can't post comments on:
http://metatalk.metafilter.com/mefi/11523

There's an error in http://www.metafilter.com/robots.txt (everything after the <cfelse> shouldn't be there)
posted by Sharcho at 4:20 PM on March 19, 2006


I didn't think the subdomains were a good idea anyway. (Quoting from the earlier thread):

This is not what subdomains are meant for, and it's very limited. Most people would like a customized front page, but domain names place too many limits: you can't exclude (NOT), can't combine (AND), can't unify (OR), there's no Unicode support, you can't add additional filters (user/date/etc.), can't highlight stories, can't tweak the layout, can't use it for AskMe/MetaTalk/Projects, and there's no way to add these features in a clean way later.
posted by Sharcho at 4:25 PM on March 19, 2006


Did you guys hear something?
posted by Gator at 4:40 PM on March 19, 2006


that is the sound of one hand clapping
posted by blue_beetle at 5:19 PM on March 19, 2006


Wow, Sharcho, you really know how to do the "I told you so" dance.
posted by mathowie (staff) at 5:25 PM on March 19, 2006


mathowie, it's not an "I told you so" dance. I just wanted to back up your position that it's not such a great loss.

You should disable the wildcard DNS.
http://tagname.metafilter.com/ returns the content of www.metafilter.com, which means that all the robots will crawl all the subdomains again.
posted by Sharcho at 5:32 PM on March 19, 2006
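[For reference, a disallow-everything robots.txt of the kind being described for the non-www hosts is just two lines. This is a generic sketch, not the file the site actually served:]

```
User-agent: *
Disallow: /
```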


Mathowie wrote:

I was telling robots to ignore all non-www domains.
posted by SuperNova at 8:26 PM on March 19, 2006


Hmm. I can't seem to figure out exactly how to make wildcard.metafilter.com redirect to www.metafilter.com. Any Apache users got that handy?
posted by mathowie (staff) at 9:52 PM on March 19, 2006


mathowie,

add a VirtualHost block, after all the other VirtualHost blocks, with *:80 that points to a /norobots directory. The directory should contain only a robots.txt file that excludes everything.


<VirtualHost *:80>
DocumentRoot /www/docs/norobots
...
</VirtualHost>

posted by Sharcho at 3:55 AM on March 20, 2006


You might also need to add a <VirtualHost metafilter.com:80> block to handle the 301 redirects from metafilter.com -> www.metafilter.com
posted by Sharcho at 4:06 AM on March 20, 2006
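[A minimal sketch of such a block, assuming mod_alias is loaded; the directory layout and placement relative to the other vhosts are illustrative, not the site's actual config:]

```
<VirtualHost *:80>
ServerName metafilter.com
Redirect permanent / http://www.metafilter.com/
</VirtualHost>
```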


BTW: if you need help with any of this, feel free to e-mail me.
posted by Sharcho at 4:11 AM on March 20, 2006


I don't care about robots.txt; I just want www2.metafilter.com, kottke.metafilter.com and anything.metafilter.com to redirect (rewrite in the browser's address bar) to www.metafilter.com. So any old link to a non-www URL will end up at the proper URL and there will only be one www URL in Google.
posted by mathowie (staff) at 7:45 AM on March 20, 2006


You need the robots.txt anyway, following the instructions I wrote above.
In the directory where you put the robots.txt, also create the following .htaccess file.


RewriteEngine on
Options -Indexes
# If domain name != 'www.metafilter.com' and requested file != 'robots.txt' then
# redirect with 301 code to www.metafilter.com
RewriteCond %{HTTP_HOST} !^www\.metafilter\.com$ [nocase]
RewriteCond %{REQUEST_URI} !^/robots\.txt$ [nocase]
RewriteRule ^(.*)$ http://www.metafilter.com/$1 [redirect=301,last]

posted by Sharcho at 12:23 PM on March 20, 2006


Nope, I was telling robots to ignore all non-www domains.

Hmm.
posted by blag at 3:42 PM on March 20, 2006


blag, those happened while there was no robots.txt on these subdomains. Matt did put a robots.txt exclusion in place, but content that was crawled before the robots.txt existed still appears in the search results. If it isn't fixed soon, more of them will appear in the results, because right now there's no robots.txt for the subdomains.

Google and Yahoo (slowly) remove results that are excluded by using the robots.txt. That's why it is important to have the robots.txt exclusions even after setting the redirects.
posted by Sharcho at 4:06 PM on March 20, 2006


The robots stuff must have been broken until quite recently, then, since one of the Yahoo results is for "blomkamp.metafilter.com" and the only post with the blomkamp tag was made in January of this year.

Still, good to know that the 371,000 useless extra results will disappear eventually :)
posted by blag at 4:56 PM on March 20, 2006

