Noonereallyneedstotypeasinglewordthatsoverthirtycharactersorso. June 22, 2007 2:36 PM

How about a filter that limits posts from having words over a certain length? Usually it's pasted URLs, and they always break the tables and require a manual cleanup. For instance.
posted by smackfu to Feature Requests at 2:36 PM (40 comments total) 1 user marked this as a favorite

Ha. Someone fixed it already. The link to "this" in that post used to just be the huge URL in plaintext, unlinked.
posted by smackfu at 2:39 PM on June 22, 2007


well the admins will be ecstatic that they get to close another metatalk post.
posted by shmegegge at 2:46 PM on June 22, 2007


Yeah, caught that. It was a biiiiig URL. It doesn't break in IE, interestingly enough.

But a filter doesn't seem very necessary—these things happen pretty rarely and then get flagged so fast and hard that we can pretty much taste the exclamation marks. It'll be rare that something gets really broken for more than a few minutes.

And it might not be so trivial: the url is perfectly legit, so how do we establish that the bare url is not okay but the url inside of an href is? Some regex could get us part of the way there, but then we've got more blackbox behavior from the site for imperfect gains, and we're still vulnerable in a broken-post sense to badly formed html and such.
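
(Just to illustrate the "part of the way there" bit — this is only a sketch in Python-style regex, nothing like the site's actual code; the 40-character cutoff and the function name are made up:)

import re

# Sketch: flag long URLs that aren't immediately preceded by href="/href='.
# The 40-character cutoff is arbitrary; URLs used as visible link text,
# badly formed HTML, and so on would still slip past it.
BARE_URL = re.compile(r'''(?<!href=["'])https?://\S{40,}''', re.IGNORECASE)

def has_bare_long_url(comment_html):
    # True if the raw comment text contains a long, unwrapped URL.
    return bool(BARE_URL.search(comment_html))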
posted by cortex (staff) at 2:48 PM on June 22, 2007


well the admins will be ecstatic that they get to close another metatalk post.

Well, I knew it would be fixed quickly (though I would have bet on jessamyn), since it breaks the whole front page in Firefox. The long word ends up below the sidebar, with a huge blank space above it.

But, like cortex says, it does get fixed fast enough by hand, so a technical solution would only be a nice-to-have.
posted by smackfu at 2:52 PM on June 22, 2007


Man, the title of this post really borked my RSS reader...
posted by vacapinta at 3:10 PM on June 22, 2007


But what if I wanted to make a post about acetylseryltyrosylserylisoleucylthreonylserylprolylserylglutaminylphenylalanylvalylphenylalanylleucylserylserylvalyltryptophylalanylaspartylprolylisoleucylglutamylleucylleucylasparaginylvalylcysteinylthreonylserylserylleucylglycylasparaginylglutaminylphenylalanylglutaminylthreonylglutaminylglutaminylalanylarginylthreonylthreonylglutaminylvalylglutaminylglutaminylphenylalanylserylglutaminylvalyltryptophyllysylprolylphenylalanylprolylglutaminylserylthreonylvalylarginylphenylalanylprolylglycylaspartylvalyltyrosyllysylvalyltyrosylarginyltyrosylasparaginylalanylvalylleucylaspartylprolylleucylisoleucylthreonylalanylleucylleucylglycylthreonylphenylalanylaspartylthreonylarginylasparaginylarginylisoleucylisoleucylglutamylvalylglutamylasparaginylglutaminylglutaminylserylprolylthreonylthreonylalanylglutamylthreonylleucylaspartylalanylthreonylarginylarginylvalylaspartylaspartylalanylthreonylvalylalanylisoleucylarginylserylalanylasparaginylisoleucylasparaginylleucylvalylasparaginylglutamylleucylvalylarginylglycylthreonylglycylleucyltyrosylasparaginylglutaminylasparaginylthreonylphenylalanylglutamylserylmethionylserylglycylleucylvalyltryptophylthreonylserylalanylprolylalanylserine?
posted by goodnewsfortheinsane at 3:18 PM on June 22, 2007 [2 favorites]


Again?
posted by cortex (staff) at 3:25 PM on June 22, 2007 [1 favorite]


I think I've asked this before, but wouldn't a better solution be to automatically linkify urls?

Then if it's over a certain length, it gets shortened to http://blah.com/longurl...whatever.html with the link intact.
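
(A rough sketch of what I mean — not a real implementation; the 60-character cutoff, the function name, and the assumption that a filter runs over the raw comment text are all made up for illustration:)

import re

MAX_DISPLAY = 60  # arbitrary cutoff for the visible link text

def autolink(text):
    # Wrap bare URLs in a link, truncating only the displayed text.
    def repl(match):
        url = match.group(0)
        label = url if len(url) <= MAX_DISPLAY else url[:MAX_DISPLAY] + "..."
        return '<a href="%s">%s</a>' % (url, label)
    # Note: a real version would need to skip URLs already inside markup.
    return re.sub(r'https?://\S+', repl, text)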
posted by empath at 3:26 PM on June 22, 2007


But that wouldn't reduce the size of the resulting string. Where's the win?
posted by cortex (staff) at 3:29 PM on June 22, 2007


The display string gets shortened with "...", but the actual URL stays the same. Why not do it?
posted by devilsbrigade at 3:39 PM on June 22, 2007


Ah! I was taking that viking a little too figuratively.

As for auto-linkifying, I know there have been a few previous discussions, not sure what Matt's current opinion on doing it is. I've always been kind of down on it, because I feel like it'd encourage laziness about it—right now, a lot of things with bare urls get flagged and/or hollered about, whereas the autolinkery might be taken as condoning that kind of cut-and-paste move officially.
posted by cortex (staff) at 3:43 PM on June 22, 2007


Metafilter: We take HTML lessons from Microsoft Word.
posted by blue_beetle at 3:47 PM on June 22, 2007


"...we can pretty much taste the exclamation marks."

Tangy?
posted by bru at 4:02 PM on June 22, 2007


How about a filter...


Wait...a META Meta Filter Filter?
posted by niles at 4:04 PM on June 22, 2007


I've always been kind of down on it, because I feel like it'd encourage laziness about it

There's nothing wrong with laziness. But we need to use it appropriately, for the benefit of the many rather than the few. In this case, there are far more people reading posts than writing them. Reader laziness in wanting to avoid seeing raw URLs in posts should take precedence over writer laziness in wanting to avoid learning how links work.
posted by scottreynen at 4:08 PM on June 22, 2007


I don't like auto-link. What if I want to mention an URL but not link to it? Linking isn't hard, and when people ask how to do it we can just tell them. Then they learn something.
posted by LobsterMitten at 4:22 PM on June 22, 2007


What if I want to mention an URL but not link to it?

It violates the Ts and the Cs old boy, don't you know? We're all here just to push a post-grad project under a sockpuppet account.
posted by yerfatma at 7:16 PM on June 22, 2007


Again?

Coat Protein, Tobacco Mosaic Virus, Dahlemense Strain.

What?
posted by goodnewsfortheinsane at 7:30 PM on June 22, 2007


A more accepted "longest word in English" would be "floccinaucinihilipilification", and it's kind of apt for Metafilter.
posted by goodnewsfortheinsane at 7:32 PM on June 22, 2007


Tangy?

Bitter.
posted by deborah at 7:48 PM on June 22, 2007


Wait...a META Meta Filter Filter?

No.
posted by Arturus at 10:50 PM on June 22, 2007


Maybe we need to automatically truncate comments that exceed a certain le
posted by The Deej at 11:26 PM on June 22, 2007


What, no supercalifragilisticexpialidocious or antidisestablishmentarianism?
posted by IndigoRain at 12:41 AM on June 23, 2007


Metafilter: taking that viking a little too figuratively.
posted by Pope Guilty at 1:25 AM on June 23, 2007


What if I want to mention an URL but not link to it?

but.. why? i've never understood this. i mean, if someone reading your mentioned URL didn't know what was there, they could just copy and paste, or type it in to their browser if they were so inclined.

it's not like by making a URL unclickable you're hiding some information or anything, just making extra work for the person who wants to find out what lives there.

anyway, maybe a useful compromise to this would be to have a little "autolink URLs" checkbox down there next to the bold/italic/link buttons that was on by default, and could be turned off if the poster was so inclined. or a user page setting that was on by default. (by default so that clueless people get their lazily-pasted links linked).
posted by sergeant sandwich at 2:51 AM on June 23, 2007 [1 favorite]


The strongest reason to have an unlinked URL is in the scenario where you don't want the end website to have the click-through information for whatever reason. Alternatively, you want to reference a website that people should not be going to if they are not prepared for it (known malware infecting sites, for example). There are valid reasons to link to such sites, depending on the situation, but having it possible for someone to accidentally click the link is a Bad Idea.
posted by Arturus at 11:42 AM on June 23, 2007 [1 favorite]


I like Arcturus' point; however, if you really wanted to solve that problem you could just have the autolinker key on "http://" prefixes; that way if you wanted to paste a link you could just start with the domain.

My question is: why would auto-linking solve the truncating problem? I'm assuming that the auto-link would just make the long-ass link text into a link itself, but the long-ass link would still be there.

The problem here is demonstrated by goodnewsfortheinsane's comment above -- why not have word wrap for exceedingly long words?

Could this be hacked by putting empty tags in the middle of such words? I don't know if that would cause the word to line-break, but it's worth a try.
posted by spiderwire at 8:16 PM on June 23, 2007


from smackfu,

how does one limit from something?

to,

quonsar
posted by quonsar at 8:19 PM on June 23, 2007


OK, I just tested it out and it works -- if you put a pair of, say < i> tags in the middle of the word with nothing in between, then Safari at least wraps the word but otherwise leaves it unbroken. Moreover, if you select, copy, and paste the string into the URL field it seems to remain unbroken.

A very simple regex could detect words that are, say, more than 80 characters long, and put empty tags in the middle of them.

Just a thought. I guess this might screw up someone, somewhere, but I'm having trouble coming up with many situations where you'd legitimately need to protect strings longer than 80 characters.
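
(In sketch form — assuming Python-ish regex and the empty < i> pair from the test above; the 80-character limit is the same arbitrary number:)

import re

def add_break_points(text, limit=80):
    # After every `limit` consecutive non-space characters, insert an empty
    # tag pair so the browser has somewhere to wrap the word.
    return re.sub(r'(\S{%d})' % limit, r'\1<i></i>', text)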
posted by spiderwire at 8:21 PM on June 23, 2007


acetylseryltyrosylserylisoleucylthreonylserylprolylserylglyclphenylalanylvalylphenylalanylleucylserylserylvalyltryptophylalanylaspartylprolylisoleucylglutamylleucylleucylasparaginylvalylcysteinylthreonylserylserylleucylglycylasparaginylglutaminylphenylalanylglutaminylthreonylglutaminylglutaminylalanylarginylthreonylthreonylglutaminylvalylglutaminylglutaminylphenylalanylserylglutaminylvalyltryptophyllysylprolylphenylalanylprolylglutaminylserylthreonylvalylarginylphenylalanylprolylglycylaspartylvalyltyrosyllysylvalyltyrosylarginyltyrosylasparaginylalanylvalylleucylaspartylprolylleucylisoleucylthreonylalanylleucylleucylglycylthreonylphenylalanylaspartylthreonylarginylasparaginylarginylisoleucylisoleucylglutamylvalylglutamylasparaginylglutaminylglutaminylserylprolylthreonylthreonylalanylglutamylthreonylleucylaspartylalanylthreonylarginylarginylvalylaspartylaspartylalanylthreonylvalylalanylisoleucylarginylserylalanylasparaginylisoleucylasparaginylleucylvalylasparaginylglutamylleucylvalylarginylglycylthreonylglycylleucyltyrosylasparaginylglutaminylasparaginylthreonylphenylalanylglutamylserylmethionylserylglycylleucylvalyltryptophylthreonylserylalanylprolylalanylserine
posted by spiderwire at 8:22 PM on June 23, 2007


Sorry about that; just wanted to test it. Works in my browser, though.
posted by spiderwire at 8:23 PM on June 23, 2007


This page has some suggestions as well, but they're not totally cross-browser compatible. I like my solution better -- maybe another option would be to insert an empty < div> in the middle of super-long strings to create an optional word-break spot. I need to go check some other browsers to make sure it's compatible in all of them.

I realize that this is a little bit of a kluge, but again, long, page-breaking strings annoy all of us, and the admins always have to fix them, so I don't have a big problem with sticking a little bit of hackish HTML in the middle of them to make things look nicer.

MeFi already filters img tags and does a bunch of other forced-formatting on comments, and my solution strikes me as just as aesthetically-objectionable as, say, pissing elephants. Or, well.... it strikes me as aesthetically objectionable. I'll just leave it at that.
posted by spiderwire at 8:48 PM on June 23, 2007


Well, it doesn't work in Firefox. Screw it. That page I linked does suggest a few markup characters like < wbr> that work in some browsers. I say toss those suckers in there.

OK, done flooding this thread now.
posted by spiderwire at 8:50 PM on June 23, 2007


OK, one last thing. <i>&shy;<wbr></i> seems to work to make invisible, optional line breaks in Firefox and Safari 3.0... people are welcome to try monkeying with it but Googling seems to indicate that it's an ongoing problem that's hoped to be solved in CSS3.

The regex of course is pretty simple, might look something like ([^ ]{50})... but I'm out of practice with regexes and web design, so I really am going to leave it now. It would be nice to get this fixed though -- it's been going on for forever now.
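
(Sketched out, with the ([^ ]{50}) grouping and the &shy;<wbr> pair taken straight from the above — still just a guess at how it might be wired up, not anything the site actually does:)

import re

def soft_wrap(text, limit=50):
    # Insert an invisible, optional break point after every `limit`
    # consecutive non-space characters.
    return re.sub(r'([^ ]{%d})' % limit, r'\1<i>&shy;<wbr></i>', text)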
posted by spiderwire at 9:17 PM on June 23, 2007


oh, and spiderwire, as a little nitpicky aside, it's Arturus, not Arcturus. No c.
posted by Arturus at 12:10 AM on June 24, 2007


Sorry about that; I was on a bit of a caffeine tear there, if it wasn't painfully apparent already.
posted by spiderwire at 12:28 AM on June 24, 2007


how does one limit from something?

Hey, people understand it, that makes it good enuff english. Amirite?
posted by smackfu at 3:29 PM on June 24, 2007


My question is: why would auto-linking solve the truncating problem? I'm assuming that the auto-link would just make the long-ass link text into a link itself, but the long-ass link would still be there.

Most boards that do auto-linking also truncate long links.
posted by smackfu at 3:31 PM on June 24, 2007


Here's a vote against auto-linking. Imagine seeing something useless like http://example.com linked.
posted by philomathoholic at 1:19 AM on June 25, 2007


Yes, that would be truly tragic.
posted by smackfu at 9:43 AM on June 25, 2007

