Hello Kyle,

On Fri, Mar 7, 2008 at 5:00 PM, Kyle Wheeler <[EMAIL PROTECTED]> wrote:
> I can think of two ways, both have their flaws.
Actually it would be better to fix the source of the problem instead of
trying to find a workaround... But I don't know where these URLs get
split in the first place. Perhaps you could enlighten me?

> 1. You can set pipe_decode before piping the message to urlview;
>    that way mutt will reconnect all the lines according to the way
>    they were encoded, and URLs won't be split. On the other hand,
>    if you use w3m as your html renderer, many URLs will simply not
>    be visible to urlview.
>
> 2. Pipe it through tr to remove all newlines and spaces before
>    piping it to urlview. Like this:
>
>        macro pager \cb "<pipe-message>tr -d ' \r\n' | urlview<enter>"
>
>    The second option seems to be the best... but it has the problem
>    that it may concatenate urls that shouldn't be concatenated (for
>    example, imagine the sentence "Go to http://www.google.com/ and
>    tell me what you think" - the url would become
>    http://www.google.com/andtellmewhatyouthink).

Why would the spaces be deleted in this case?

> Perhaps it's better to do this:
>
> 3. Pipe it through lynx to extract the urls before piping it to
>    urlview, like so:
>
>        macro pager \cb "<pipe-message>lynx --force-html --dump | urlview<enter>"

Yes. It would be nice to apply this macro to html emails only. For
text emails, just do the usual/fast thing.

Thank you for your useful feedback.

-- 
Francis
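P.S. A quick way to see the concatenation problem in option 2 for yourself: the macro's `tr -d ' \r\n'` deletes every space in the message body, not just the line breaks inside wrapped URLs, so any words following a URL get glued onto it. This is a minimal sketch feeding Kyle's own example sentence through the same filter:

```shell
# Run Kyle's example sentence through the tr filter from the option-2
# macro: all spaces and newlines are stripped, so the trailing words
# become part of the URL that urlview would then offer to open.
printf 'Go to http://www.google.com/ and tell me what you think\n' \
    | tr -d ' \r\n'
# → Gotohttp://www.google.com/andtellmewhatyouthink
```

Which is presumably why option 3 looks safer: lynx re-extracts the actual link targets from the rendered page instead of blindly gluing lines back together.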