When will we automatically detect doubles? March 23, 2000 7:49 PM   Subscribe

When will MetaFilter be able to block links that have already been posted by other people? I think it should only be one or two lines of code to compare the URL against a database of previous links (plus a line to perform some sort of fuzzy logic for variations on links). Oh, and I think he should add programming to keep people from posting lame stuff too.
Like this post.
And that one down there.
posted by CrazyUncleJoe to Feature Requests at 7:49 PM (3 comments total)
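The "fuzzy logic for variations on links" the post imagines could be as simple as normalizing URLs before comparing them, so trivial variants (scheme, a "www." prefix, a trailing slash) count as the same link. A hypothetical sketch, not MetaFilter's actual code:

```python
from urllib.parse import urlsplit

def normalize_url(url):
    """Reduce a URL to a canonical form so trivial variants match.

    Hypothetical sketch: lowercases the host, strips the scheme,
    a leading "www.", and any trailing slash from the path.
    """
    parts = urlsplit(url.strip())
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path.rstrip("/")
    query = ("?" + parts.query) if parts.query else ""
    return host + path + query

# Variants of the same link normalize identically:
normalize_url("http://www.example.com/page/")  # -> "example.com/page"
normalize_url("https://example.com/page")      # -> "example.com/page"
```

Comparing normalized forms instead of raw strings would catch most accidental doubles, though not mirrors or redirects.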

I see your point, cujoe. In the past couple of weeks, repetition of previous posts has become commonplace. However, what about posts that refer to the front page of a commonly linked site?
posted by Ms Snit at 12:43 AM on March 24, 2000


I was talking about this last night with a certain weblogging elitist bastard. Yeah, the search engine currently indexes everything, including URLs, so I could do a check in the db, and spit out a "this appears to have been posted before, are you sure you want to post this?" Let me see what I can do.
posted by mathowie (staff) at 8:45 AM on March 24, 2000
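The check mathowie describes, looking the URL up among previously posted links and asking for confirmation rather than blocking outright, might be sketched like this. The function and names are illustrative assumptions; the real site would query its search index, not an in-memory set:

```python
def check_duplicate(url, previous_urls):
    """Return a warning string if the URL was posted before, else None.

    `previous_urls` stands in for the indexed URLs in the database;
    a warning (not a hard block) lets the poster decide.
    """
    if url in previous_urls:
        return ("This appears to have been posted before. "
                "Are you sure you want to post this?")
    return None

posted = {"http://example.com/story"}
check_duplicate("http://example.com/story", posted)  # warning string
check_duplicate("http://example.com/other", posted)  # None
```

Warning instead of blocking sidesteps Ms Snit's objection: a repeat link to a site's front page can still go through once the poster confirms.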


Actually, my point was that in a forum like this, you're going to get some repetition and you're going to get some posts that are of varying quality. Essentially, I was making fun of the previous post by posting a duplicate of it. I guess I'm going to have to start adding this link when I make posts like that...
posted by CrazyUncleJoe at 10:43 AM on March 24, 2000

