Re: Long urls - update

2013-04-01 Thread Luis Mochan
> Unix shell handles variables abysmally. You need to help it a lot to
> do the right thing. *Always* quote variables, else if they're empty
> they tend to blow up on you.

Thanks for the advice! Your script did work from the command line, but it was not enough when called from extract_url.pl. Any
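The "always quote" advice above can be demonstrated in a few lines. This is a minimal sketch (the helper name `count_args` is illustrative, not from the thread): an unquoted empty variable vanishes from the command line entirely, while a quoted one survives as a single empty argument.

```shell
#!/bin/sh
# Show why unquoted shell variables "blow up": an unquoted empty
# variable expands to nothing at all, so the argument disappears.
count_args() { echo $#; }

empty=''
unquoted_argc=$(count_args $empty)    # empty var vanishes: 0 arguments
quoted_argc=$(count_args "$empty")    # quoted empty var survives: 1 argument

echo "unquoted: $unquoted_argc, quoted: $quoted_argc"
```

The same mechanism is why a URL containing spaces or shell metacharacters must be expanded as `"$url"`, never bare `$url`.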

Re: Long urls - update

2013-04-01 Thread s. keeling
Incoming from Luis Mochan:
> I don't know much about shell programming, but I found that
> /etc/urlhandler/url_handler.sh is a shell script that obtains its url
> doing '$url=$1'. I replaced the whole handler by the following
> program:
> #! /bin/bash
> url=$1; shift
> echo $url >>t
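The replacement handler quoted above uses `echo $url` with an unquoted expansion, which is exactly where an ampersand or semicolon in the URL gets reinterpreted by the shell. A hedged sketch of a quoting-safe version (the function name and use of `printf` are illustrative, not the actual Debian url_handler.sh):

```shell
#!/bin/sh
# Sketch of a quoting-safe URL handler: every *expansion* of the
# variable is double-quoted, so &, ;, <, >, spaces and wildcards
# inside the URL stay literal characters instead of shell syntax.
handle_url() {
    url="$1"
    shift
    # the original snippet logged with:  echo $url >>t
    # printf with a quoted expansion keeps the URL intact:
    printf '%s\n' "$url"
}

handle_url 'http://example.com/?a=1&b=2'
```

With the quoted expansion, the whole URL is emitted as one piece; unquoted, the `&` would have backgrounded the command instead.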

Re: Long urls - update

2013-04-01 Thread Luis Mochan
Hi Kyle,

> I'm the author of extract_url.pl, so perhaps I can shed some light
> here.

Thanks.

> The *correct* place to "fix" the issue of escaping (or otherwise
> sanitizing) ampersands is in the sanitizeuri function (line 208). The
> current version of extract_url.pl uses this:
>
> sub sa
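The quote cuts off before the body of `sanitizeuri`, so the following is only a sketch of the idea, in shell rather than Perl, and is NOT the actual extract_url.pl code: backslash-escape the characters the thread identifies as shell-sensitive (`& ; < >`) before the URL is interpolated into a command line. The function name and exact character set here are assumptions.

```shell
#!/bin/sh
# Sketch only -- not the real sanitizeuri from extract_url.pl.
# Escapes shell-special characters in a URI with a backslash.
sanitize_uri() {
    # In the sed replacement, \\ is a literal backslash and & stands
    # for the matched character, so each special char becomes \<char>.
    printf '%s' "$1" | sed 's/[&;<>]/\\&/g'
}

sanitize_uri 'http://example.com/?a=1&b=2;x'
```

A URL sanitized this way can then be interpolated into a shell command without the `&` or `;` being parsed as command separators.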

Re: Long urls - update

2013-04-01 Thread Kyle Wheeler
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA256

On Sunday, March 31 at 11:16 PM, quoth Luis Mochan:
>> I'm a perl guy, yet that's non-trivial here. Thx. :-)
>>
> You're welcome. I don't know if there are other characters that appear
> in an url and need to be escaped for the shell ([;><]?); the

Re: Long urls - update

2013-04-01 Thread Luis Mochan
> By the way, the author of the program, Kyle Wheeler, wrote to me that
> he expects that adding the line
>
> COMMAND /etc/urlview/url_handler.sh '%s'
>
> to the configuration file ~/.extract_urlview would be enough to solve
> the problem (with %s between quotes). I believe I had tried that and

Re: Long urls - update

2013-04-01 Thread Luis Mochan
Hi John,

> I guess I'm the slow one on the list.
> Is there more to the patch than commenting out
>
> # $command =~ s/%s/'$url'/g;
>
> and replacing it with
>
> $command=~s/&/\\&/g

I didn't comment out that line; it is needed to replace %s by the URL in the 'COMMAND' that actually opens the URL
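The effect of the Perl patch `$command =~ s/&/\\&/g` can be seen from the shell side: in `browser http://example.com/?a=1&b=2`, the shell treats `&` as a command terminator, backgrounds `browser http://example.com/?a=1`, and then tries to run `b=2` as a separate command. A minimal shell rendering of the same substitution (the example URL is made up):

```shell
#!/bin/sh
# Shell analogue of Perl's  $command =~ s/&/\\&/g : prefix each '&'
# with a backslash so the shell reads it as a literal character
# instead of a background-job separator.
url='http://example.com/?a=1&b=2'

escaped=$(printf '%s' "$url" | sed 's/&/\\&/g')

printf '%s\n' "$escaped"
```

After this substitution the whole URL survives shell parsing as one argument, which is why the `%s`-replacement line itself must stay in place: it is what injects the (now escaped) URL into the COMMAND template.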

Re: Long urls - update

2013-04-01 Thread s. keeling
Incoming from John Niendorf:
> I guess I'm the slow one on the list.
> Is there more to the patch than commenting out
>
> # $command =~ s/%s/'$url'/g;
>
> and replacing it with
>
> $command=~s/&/\\&/g
>
> Because either way, extract_url.pl isn't working for me.

It looks like that was incorr

Re: Long urls - update

2013-04-01 Thread John Niendorf
Hi Guys,

I guess I'm the slow one on the list. Is there more to the patch than commenting out

# $command =~ s/%s/'$url'/g;

and replacing it with

$command=~s/&/\\&/g

Because either way, extract_url.pl isn't working for me. I can see the list of urls, but if I click on one I still get a page

Re: Long urls - update

2013-04-01 Thread Chris Bannister
On Tue, Apr 02, 2013 at 01:06:05AM +1300, Chris Bannister wrote:
> On Sun, Mar 31, 2013 at 10:00:58PM -0600, s. keeling wrote:
> > Incoming from Luis Mochan:
> > > I found a mistake in the extract_url.pl program: it doesn't escape
> > > ampersands when present in the url, so when the command to act

Re: Long urls - update

2013-04-01 Thread Chris Bannister
On Sun, Mar 31, 2013 at 10:00:58PM -0600, s. keeling wrote:
> Incoming from Luis Mochan:
> > I found a mistake in the extract_url.pl program: it doesn't escape
> > ampersands when present in the url, so when the command to actually
> > view the url is invoked, the shell gets confused. I made a quic

Re: Long urls - update

2013-04-01 Thread James Griffin
Sun 31.Mar'13 at 15:37:28 -0600 Luis Mochan
> I found a mistake in the extract_url.pl program: it doesn't escape
> ampersands when present in the url, so when the command to actually
> view the url is invoked, the shell gets confused. I made a quick fix
> by substitu