screening big sites from FPP   October 20, 2001 3:05 PM

What if the URL field on the MetaFilter Post a Link page were screened so it would be impossible or at least difficult to link to the home page of CNN, The Onion, etc.? People could see a message reiterating the guidelines to post to something specific, and it would keep them from annoying people who are annoyed by these kinds of links.
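A minimal sketch of how such a screening check on the URL field might look (the blocklist, message text, and function name here are illustrative assumptions, not MetaFilter's actual code or guidelines):

```python
import re
from typing import Optional

# Hypothetical blocklist of home pages the post suggests screening out.
BLOCKED_HOME_PAGES = {"cnn.com", "theonion.com", "news.yahoo.com"}
GUIDELINE_MESSAGE = "Please link to something specific, not a site's home page."

def screen_url(url: str) -> Optional[str]:
    """Return a guideline reminder for a blocked home-page URL, else None."""
    match = re.match(r"https?://(?:www\.)?([^/]+)(/.*)?$",
                     url.strip(), re.IGNORECASE)
    if not match:
        return None
    host, path = match.group(1).lower(), match.group(2) or "/"
    # Only bare home-page links are screened; deep links to a
    # specific article or page pass through untouched.
    if host in BLOCKED_HOME_PAGES and path in ("/", "/index.html"):
        return GUIDELINE_MESSAGE
    return None
```

A deep link like `http://www.cnn.com/2001/US/some-story.html` would pass, while `http://cnn.com/` would trigger the reminder.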
posted by kirkaracha to Feature Requests at 3:05 PM (14 comments total)

I think that the U.S. Government is a perfect example of why this won't work. You can legislate against stupidity, but people will invariably still find their way to it.
posted by fooljay at 8:15 PM on October 20, 2001 [1 favorite]

though most definitely not always the case, sometimes there are events that warrant immediate discussion because they haven't really hit the normal sites yet... but like i said, this rarely happens, so it might not be an issue.
posted by lotsofno at 8:57 PM on October 20, 2001

Is it too much to just ignore the posts? Why does it bother people so much?

I think it interests people more than it annoys. Not everyone reads the humor sites all day. Like I said, just ignore it and quit policing everyone else's realm.

Note: I do realize the bandwidth and database overhead implications. That shouldn't have to be an issue though. Now if the web only allowed for a table of contents...
posted by bloggboy at 11:25 PM on October 20, 2001


despite what you would rather have things be, cnn, yahoo, and the onion links do cause increases in bandwidth and database overhead. whether it should or should not "have" to be an issue -- it is an issue, like it or not.

i personally find the complaints about metafilter "policing" absurd. i have a right to say i would rather not see all of these yahoo links as much as you've a right to say that you do, and if i like, i shall exercise my rights.
posted by moz at 12:37 AM on October 21, 2001

I don't think the crux of the matter is that it's a bandwidth-and-database issue so much as it is an info-space issue. When there are innumerable links to commonplace material cluttering up the mainpage, MetaFilter ceases to be much of a filter at all.
posted by youhas at 12:49 AM on October 21, 2001

Actually, I wouldn't mind seeing some clearer guidelines about what a double post actually is. Yeah, some of it is obvious, but some of it does get smoky to me.

In a site that goes back to 1999, with the growth curve really hitting its max around May or so of this year, a double post of any 1999 material would be new material for easily 98% of the user base. Should we throw a flag down?

I was recently called on a 'double post' - not a double post of a 'front page' link, but of a nested link within a discussion from a different front page link. The call could be wrong; I'll go either way equably - I tried a search and it came back empty, so I'm not feeling terribly villainous. But does a front page link that is meant to spotlight a certain angle or perspective become inadmissible in all cases if the link was once included to round out a different discussion in a thread?

Example: a 'Star Wars - crappy or not' story goes to front page discussion, and within that discussion, someone who says Star Wars is crappy cites a linked interview with Alec Guinness, who said he thought 'Star Wars' was crappy. Down the line, someone creates a front page thread about people who became best known for their most-hated work; included among the front page links are Ravel on 'Bolero', Anthony Burgess on 'A Clockwork Orange', Beethoven on 'Für Elise', and Alec Guinness on 'Star Wars'. Would this call for a ten yard penalty and loss of down?

In a similar vein, let's say two people over the course of several months start two different discussions - "What Did Jesus Look Like" and "The Shroud of Turin" - each linking to a different inner page of a 20-page, exhaustive vertical site, "The Life of Christ", on NOVA's website. Since we've tagged the vertical once, do we consider the second a double post?

posted by Perigee at 1:52 AM on October 21, 2001

The best solution to the double-post problem would be to store all posted URLs in a quick-search database (maybe a binary tree) and reject any new post that contains a URL matching a URL already in the database.

It probably would reduce double posting enough to stop most of the whiners from whining, and it would require very little processing and no live intervention. If Matt then wanted to police the posts that slipped by, it would be much easier for him to do so.
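The idea above can be sketched in a few lines (names are illustrative; a hash set stands in for the binary tree, since exact-match lookup is all that's needed, and the normalization rules are assumptions about what should count as "the same" URL):

```python
from urllib.parse import urlsplit

class PostedUrlIndex:
    """Store every posted URL and reject exact re-posts.

    A set gives O(1) average-case lookups; the binary tree mentioned
    above would also work, at O(log n) per lookup.
    """

    def __init__(self) -> None:
        self._seen: set[str] = set()

    @staticmethod
    def _normalize(url: str) -> str:
        # Fold trivially different forms of the same address together,
        # so http://WWW.Example.com/page/ and http://example.com/page match.
        parts = urlsplit(url.strip().lower())
        host = parts.netloc[4:] if parts.netloc.startswith("www.") else parts.netloc
        path = parts.path.rstrip("/")
        return f"{host}{path}?{parts.query}" if parts.query else f"{host}{path}"

    def try_post(self, url: str) -> bool:
        """Record the URL and return True, or return False if it's a double."""
        key = self._normalize(url)
        if key in self._seen:
            return False
        self._seen.add(key)
        return True
```

Note this only catches exact re-posts of one address; two different inner pages of the same site, as in the NOVA example above, would both be accepted.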
posted by pracowity at 5:28 AM on October 21, 2001

and it would keep them from annoying people who are annoyed by these kinds of links.

If we tweaked the system to disallow all the kinds of links that annoyed some people, we'd be left with a blank page to stare at.
posted by rushmc at 5:52 AM on October 21, 2001

It's not about who's annoyed or who's not, it's about the quality of site that we want to have around here. It's about putting the filter back in MetaFilter.

It's about not letting this place devolve into an unreadable mess with 300 front page links a day, all of them uninteresting. That's what will happen if we drop our standards.

It's about not letting this place devolve into just another discussion board, where the quality of conversation is on par with the rest of the web. It's about keeping the intelligent, informed, civil vibe that we've got going around here. It's about educating our new members, and helping them to raise their expectations of what an internet community can be.

So many internet discussions are inane, or shallow, or rude, or simply boring. It's about making this place better than that.

There are hundreds of sites that you can go to if you want to discuss run-of-the-mill Ananova links. Heck, anybody with a blog and a discussion component can set that up. What makes MetaFilter different is our vision of standards and quality. If we lose that, I think that we've lost the game.
posted by gd779 at 7:21 AM on October 21, 2001

It's not about who's annoyed or who's not, it's about the quality of site that we want to have around here.

"Quality" being determined by, well, you, I suppose?
posted by rushmc at 9:45 AM on October 21, 2001

"Quality" being determined by, well, you, I suppose?

No, by us. The MetaFilter community communicates a message with every post we make. The only question is, what are we communicating?

What standard of quality are we going to encourage? Do we just quietly accept Ananova links, or do we encourage posters to post only fresh links? Will we just resign ourselves to wading through a mountain of recycled crap in order to find the few diamond posts, or will we encourage a culture of uniqueness and innovation?

Like government and television programming, we get the MetaFilter we deserve. This place is only what we make of it, no more and no less.
posted by gd779 at 10:21 AM on October 21, 2001

I agree with all of that, gd779. I merely wanted to point out that "quality" does imply a value judgement, and that any individual (other than Matt) or gang who tries to claim the prerogative of making that judgement call exclusively does so with no moral credibility. I'm all about consensus, however.
posted by rushmc at 11:58 AM on October 21, 2001

True, skallas, that would be a nice test, and I for one might actually pay a little more attention to MeFi that week. But still, with the thousands, or perhaps tens of thousands, of media-esque sites out there (think: your local paper online), it wouldn't be any more difficult to link to some banal story about the news of the day.

Although it might help.

(thinks this subject is getting to be quite the 'dead horse')
posted by canoeguide at 3:23 AM on October 22, 2001

Actually it would be more difficult. It wouldn't be hard, but it would be harder than it is now, and that might make all the difference.
posted by sudama at 7:27 AM on October 22, 2001
