How impossible would it be to merge thread displays for double-posts? January 22, 2002 7:18 AM
How impossible would it be to merge thread displays for double-posts? If, for example, there are parallel discussions and small cries of "double post you fool," why not parse the threads together, so each one includes posts from both threads, in date order? Voila, no more repeated conversations or wasted threads.
Pardon any programming naivete in the suggestion; I'd love to know whether this is feasible (and if it would be considered useful or confusing).
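A rough sketch of what that date-ordered display merge could look like, assuming a hypothetical comments table with thread_id and posted_at columns, and written in PHP with PDO purely for illustration (MetaFilter itself runs on ColdFusion, and none of these names come from its real schema):

<?php
// Minimal sketch of the "merge the displays" idea: fetch the comments of
// both threads in one pass, ordered by date. Table and column names are
// assumptions, not MetaFilter's actual schema.
function merged_comments(PDO $db, int $threadA, int $threadB): array
{
    $stmt = $db->prepare(
        'SELECT thread_id, author, body, posted_at
           FROM comments
          WHERE thread_id IN (:a, :b)
          ORDER BY posted_at ASC'
    );
    $stmt->execute([':a' => $threadA, ':b' => $threadB]);

    // Each row remembers which thread it came from, so the page can label it.
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}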
Perhaps if Matt went into the second post:
• deleted all of the markers of "double-post"
• posted a form letter on double posting and what you can do to avoid it
• turned off the comments on that thread to prevent more nonsense
This way, you could open the thread, see it as a double-post and be redirected to the original - there would be none of the typical "Hey, this is a double-post" and "I looked for it, honest" - just a simple, but firm, reprimand from Matt, discouraging future sloppy FPPs.
posted by hotdoughnutsnow at 8:50 AM on January 22, 2002
just a simple, but firm, reprimand from Matt, discouraging future sloppy FPPs.
Well, that suggests that double-posts are always sloppy. Sometimes they're entirely accidental; sometimes a post intentionally recalls an older thread. I was just musing on the utility of seeing all relevant discussions on a single page--which, by my assessment, would actually legitimize the occasional innocent double-post if used with discretion (I dare say that's not a bad thing).
Matt?
posted by werty at 9:33 AM on January 22, 2002
Doesn't Matt already consolidate double post comments into the original? I thought so, anyway.
posted by Hankins at 9:38 AM on January 22, 2002
I think it's a good idea. If it were technically possible to merge accidental or time-lagged double posts (say news items and revivals, respectively) it would make reading and commenting much easier and richer.
Recently there have been several instances of spontaneous duplications where we had two concurrent threads - sometimes both interesting - that only the most dedicated members will have read.
Well *blush* I usually read both and, although each one has its own dialectic, dynamics and all that sort of shit, it would have been better, for normal readers who have better things to do, to merge them.
Deleting obviously spurious double posts is another question entirely. And, as I interpret werty's request, the idea is precisely to spare Matt the trouble of having to copy-and-paste the bastards, via some automatic wizard programming.
"Double post" is a far too wide-ranging term, anyway. ;)
posted by MiguelCardoso at 10:30 AM on January 22, 2002
Well, the programming would be pretty easy for that. I'll give a short, off-the-cuff explanation of how it could be done. Bear in mind, though, that I don't exactly know how Matt's administration script works, but I think I have an idea.
I know the site uses ColdFusion's .cfml for its HTML pre-processor. I'm assuming that this site is database driven (natch), probably using either MySQL or some derivative.
If this is the case, you'd need pretty much three methods to take care of a double post, and some sort of form in the admin script (such as a pair of checkboxes) to indicate which two threads are the same (i.e. which ones to merge together).
So, method #1 (let's call it is_double_post) would take the input from Matt's HTML form and grab the necessary information from the database for the two posts. It would then work something like this -
function is_double_post($post_id_1, $post_id_2 /* plus whatever else is needed to connect to the db */) {
    // Db connection commands
    // ...
    // Eventually, you'd have the DB data from both posts in associative arrays;
    // let's call them $someData1 and $someData2.
    if (checkDatePosted($someData1, $someData2)) {
        merge($someData1, $someData2);
    } else {
        merge($someData2, $someData1);
    }
}
The second method would check the dates of the two posts from the database. It returns true if date1 is earlier than date2, and false otherwise. It's just used to figure out which thread will retain its link title and author.
function checkDatePosted($someDataFrom1, $someDataFrom2) {
    if ($someDataFrom1['date'] < $someDataFrom2['date']) {
        return true;
    } elseif ($someDataFrom1['date'] > $someDataFrom2['date']) {
        return false;
    } else {
        return true;
    }
}
The third method is the merge method. This will take the two posts, and get rid of the title, body, author and time of the second one. It will then take all of the messages posted to the second link, and plop them all on to the end of the first post. Maybe with some sort of caveat that reads "Linked from a now non-working thread" next to the post? Doesn't really matter, in terms of program flow.
I'm not going to write the merge method, because there are too many variables involved (db connection protocol, how the db is stored, how articles are retrieved, etc.), but it's not hard at all to write. Probably under twenty lines.
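For illustration only, here's a rough sketch of what such a merge method might look like, assuming a hypothetical posts/comments schema in MySQL and a PDO connection; none of these table or column names are MetaFilter's real ones:

<?php
// Hypothetical merge: re-parent the duplicate thread's comments onto the
// original, tag them, and retire the duplicate. All names are assumptions.
function merge(PDO $db, array $original, array $duplicate): void
{
    $db->beginTransaction();

    // Move every comment from the duplicate onto the original thread,
    // appending a note so readers know where it came from.
    $move = $db->prepare(
        'UPDATE comments
            SET post_id = :orig,
                body    = CONCAT(body, :note)
          WHERE post_id = :dupe'
    );
    $move->execute([
        ':orig' => $original['id'],
        ':dupe' => $duplicate['id'],
        ':note' => ' [moved from a merged double post]',
    ]);

    // Blank out and close the duplicate instead of deleting it, so old
    // links to it still land somewhere sensible.
    $retire = $db->prepare(
        'UPDATE posts SET body = :body, closed = 1 WHERE id = :dupe'
    );
    $retire->execute([
        ':body' => 'This thread was merged into an earlier post.',
        ':dupe' => $duplicate['id'],
    ]);

    $db->commit();
}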
Matt, if you're interested in adding this feature, just drop me an e-mail and let me know how the above variables work. I'd be happy to spit out the code for you.
-SJ
posted by SweetJesus at 2:34 PM on January 22, 2002
I was going to post the exact same thing, but SweetJesus just edged me out.
posted by Skot at 3:22 PM on January 22, 2002
Skallas, I think double posts suck. It's annoying to read the same crap two or three times on the front page.
But double posts of threads that are older (let's say -- 6 months) suck too. I really like Metafilter, and I like when the content is new (to the site -- not necessarily "fresh") and interesting. Remember how boring MeFi was when the server kept timing out? Nobody was searching, and the threads were all quadruple posts.
Oh, and SweetJesus rocks.
posted by jennak at 3:33 PM on January 22, 2002
of course, dps are no reason for people to act like assholes. agreed, skallas. but i still don't think they should remain on the site; removing them lets people know they should be more cautious.
posted by jennak at 5:28 PM on January 22, 2002
DPs shouldn't even be an issue. They should be very easy, in theory, to get rid of, or to eliminate before they even get posted.
One good way would be to check the URL that's being used in the title. Have a flat file full of URLs, say the last 60 or so, and just read the file into a string and use something that will split the file up into an array on \n characters (I know the explode() function does this in PHP, but I'm not sure about CF). If you find a dupe URL, just tell the user and refuse to accept the post.
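A quick sketch of that flat-file check in PHP; the file name and the form field are invented for the example:

<?php
// Hypothetical flat-file duplicate check: one URL per line in recent_urls.txt.
function is_duplicate_url(string $submitted_url, string $file = 'recent_urls.txt'): bool
{
    if (!file_exists($file)) {
        return false;
    }

    // Read the whole file into a string and split it on newlines.
    $recent = explode("\n", trim(file_get_contents($file)));

    return in_array(trim($submitted_url), $recent, true);
}

// In the posting script: warn the user and refuse the post on a match.
// if (is_duplicate_url($_POST['url'])) { die('Looks like a double post.'); }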
I'm not sure how much strain Matt's server is under (I know the search feature always seems to be pretty stressed), so this may be too many reads and writes to perform before posting.
Another good way is to let the users decide what's a double post. A small link could be placed on the top of each post (in the message area) that says something like -
Is this link a double post?
After this URL gets like 100 clicks or so, it would automatically delete the post. Of course, this way is really insecure, and probably not the best way to administer a site of this size, but it's an option.
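Roughly, that click-to-flag idea could be wired up like this; the threshold, table, and column names are all made up for the sketch:

<?php
// Hypothetical "is this a double post?" counter: each click bumps a flag
// count on the thread, and past a threshold the thread is hidden for review.
const FLAG_THRESHOLD = 100;

function flag_double_post(PDO $db, int $post_id): void
{
    $db->prepare('UPDATE posts SET dp_flags = dp_flags + 1 WHERE id = :id')
       ->execute([':id' => $post_id]);

    $count = $db->prepare('SELECT dp_flags FROM posts WHERE id = :id');
    $count->execute([':id' => $post_id]);

    if ((int) $count->fetchColumn() >= FLAG_THRESHOLD) {
        // Hide rather than delete outright, so a human can double-check.
        $db->prepare('UPDATE posts SET hidden = 1 WHERE id = :id')
           ->execute([':id' => $post_id]);
    }
}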
I'm just trying to present easy solutions to the problem.
Oh, and thanks Jennak :-)
-SJ
posted by SweetJesus at 7:41 PM on January 22, 2002
A while back I'd thought of a similar method to report a DP. Have a link under the thread on the comments page asking "Is this thread a double post?" with a short one-line form to give the reason why it is a DP. If 3 separate MeFi members send in a DP notification, the DP link at the bottom of the page disappears. Matt gets notified with all 3 reasons, and if the thread is a DP, he deals with it accordingly; else the thread goes back to being normal.
posted by riffola at 8:37 PM on January 22, 2002
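Something like riffola's scheme could be sketched as follows; the table names, the threshold of three, and the admin address are assumptions for the example:

<?php
// Hypothetical three-report scheme: record each member's reason, and once
// three distinct members have reported the thread, mail the reasons to the
// admin (the page template would also stop showing the report link).
function report_double_post(PDO $db, int $post_id, int $user_id, string $reason): void
{
    $db->prepare(
        'INSERT INTO dp_reports (post_id, user_id, reason) VALUES (:p, :u, :r)'
    )->execute([':p' => $post_id, ':u' => $user_id, ':r' => $reason]);

    $count = $db->prepare(
        'SELECT COUNT(DISTINCT user_id) FROM dp_reports WHERE post_id = :p'
    );
    $count->execute([':p' => $post_id]);

    if ((int) $count->fetchColumn() >= 3) {
        $reasons = $db->prepare('SELECT reason FROM dp_reports WHERE post_id = :p');
        $reasons->execute([':p' => $post_id]);
        mail(
            'admin@example.com',
            "Possible double post #$post_id",
            implode("\n", $reasons->fetchAll(PDO::FETCH_COLUMN))
        );
    }
}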
One good way would be to check the URL that's being used in the title.
Metafilter already does this, give it a try. Instead of refusing the post it just warns you. The real problem with doubles is that there's about thirty sites out there with the same AP or Reuters text, so you could get dozens of different URLs with the exact same story.
Well, as I said, it's a pretty simple fix, not a complex solution. It's gonna work in some cases, but not in all of them. I think the best way to deal with it is to use the suggestion I posted a few posts ago, but whatever.
Caveat Emptor
-SJ
posted by SweetJesus at 9:53 PM on January 22, 2002