> Unix shell handles variables abysmally. You need to help it a lot to
> do the right thing. *Always* quote variables, else if they're empty
> they tend to blow up on you.
Thanks for the advice! Your script did work from the command line, but
it was not enough when called from extract_url.pl.
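The pitfall keeling describes is easy to reproduce. A minimal sketch (the variable name is illustrative):

```shell
#!/bin/bash
# Demo: why an unquoted empty variable "blows up" shell tests.
url=""

# Unquoted: $url expands to nothing, so `[` sees only `-n`, and a
# lone non-empty string is truthy -> the test wrongly passes.
if [ -n $url ]; then unquoted="wrongly non-empty"; else unquoted="empty"; fi

# Quoted: "$url" is passed as one (empty) argument -> -n is false.
if [ -n "$url" ]; then quoted="non-empty"; else quoted="empty"; fi

echo "unquoted test says: $unquoted"
echo "quoted test says:   $quoted"
```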

Incoming from Luis Mochan:
>
> I don't know much about shell programming, but I found that
> /etc/urlhandler/url_handler.sh is a shell script that obtains its url
> doing '$url=$1'. I replaced the whole handler by the following
> program:
> #! /bin/bash
> url=$1; shift
> echo $url >>t
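Following the quoting advice from earlier in the thread, a safer version of that test handler might look like this (the log file `t` is from Luis's original; this is a sketch, not the shipped url_handler.sh):

```shell
#!/bin/bash
# url_handler.sh (sketch): quote every expansion so a URL containing
# '&', ';', or spaces arrives intact, and an empty value doesn't vanish.
url="${1-}"                # the URL passed in as the first argument
echo "$url" >> t           # the debug log from the thread, now quoted
```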

Hi Kyle,
> I'm the author of extract_url.pl, so perhaps I can shed some light
> here.
Thanks.
> The *correct* place to "fix" the issue of escaping (or otherwise
> sanitizing) ampersands is in the sanitizeuri function (line 208). The
> current version of extract_url.pl uses this:
>
> sub sa

On Sunday, March 31 at 11:16 PM, quoth Luis Mochan:
>> I'm a perl guy, yet that's non-trivial here. Thx. :-)
>>
> You're welcome. I don't know if there are other characters that appear
> in an url and need to be escaped for the shell ([;><]?); the
> By the way, the author of the program, Kyle Wheeler, wrote to me that
> he expects that adding the line
> COMMAND /etc/urlview/url_handler.sh '%s'
> to the configuration file ~/.extract_urlview would be enough to solve
> the problem (with %s between quotes). I believe I had tried that and
>
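Why the single quotes around %s matter: once the template is filled in, the quotes keep the whole URL as a single shell word. A sketch, using a stand-in function for url_handler.sh:

```shell
#!/bin/bash
# Stand-in for url_handler.sh: report how many arguments arrived.
handler() { printf 'args=%d first=%s\n' "$#" "$1"; }

url='http://example.com/?a=1&b=2'

# With COMMAND ... '%s', the shell eventually runs: handler '<url>'
eval "handler '$url'"
# Without the quotes it would run: handler http://example.com/?a=1 & b=2
# i.e. the '&' backgrounds the command and the URL is cut in half.
```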

Hi John,
> I guess I'm the slow one on the list.
> Is there more to the patch than commenting out
>
> # $command =~ s/%s/'$url'/g;
>
> and replacing it with
>
> $command=~s/&/\\&/g
I didn't comment out that line; it is needed to replace %s by the URL
in the 'COMMAND' that actually opens the URL.
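In other words, that line splices the single-quoted URL in place of %s to build the command string. Roughly what the substitution produces, sketched in bash (the template path is the one from the COMMAND line above):

```shell
#!/bin/bash
# Bash sketch of what Perl's  $command =~ s/%s/'$url'/g  amounts to.
command_tmpl="/etc/urlview/url_handler.sh %s"
url='http://example.com/?a=1&b=2'

# Replace %s with the single-quoted URL.
built=${command_tmpl//%s/"'$url'"}
echo "$built"
# -> /etc/urlview/url_handler.sh 'http://example.com/?a=1&b=2'
```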

Incoming from John Niendorf:
>
> I guess I'm the slow one on the list.
> Is there more to the patch than commenting out
>
> # $command =~ s/%s/'$url'/g;
>
> and replacing it with
>
> $command=~s/&/\\&/g
>
> Because either way, extract_url.pl isn't working for me.
It looks like that was incorr

Hi Guys,
I guess I'm the slow one on the list.
Is there more to the patch than commenting out
# $command =~ s/%s/'$url'/g;
and replacing it with
$command=~s/&/\\&/g
Because either way, extract_url.pl isn't working for me.
I can see the list of urls, but if I click on one I still get a page

On Tue, Apr 02, 2013 at 01:06:05AM +1300, Chris Bannister wrote:
> On Sun, Mar 31, 2013 at 10:00:58PM -0600, s. keeling wrote:
> > Incoming from Luis Mochan:
> > > I found a mistake in the extract_url.pl program: it doesn't escape
> > > ampersands when present in the url, so when the command to act

On Sun, Mar 31, 2013 at 10:00:58PM -0600, s. keeling wrote:
> Incoming from Luis Mochan:
> > I found a mistake in the extract_url.pl program: it doesn't escape
> > ampersands when present in the url, so when the command to actually
> > view the url is invoked, the shell gets confused. I made a quic

Sun 31.Mar'13 at 15:37:28 -0600 Luis Mochan
> I found a mistake in the extract_url.pl program: it doesn't escape
> ampersands when present in the url, so when the command to actually
> view the url is invoked, the shell gets confused. I made a quick fix
> by substitu
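The confusion Luis describes comes from the shell treating an unquoted '&' as "run the preceding command in the background". A minimal illustration of his escaping fix (the sed call mirrors the Perl s/&/\\&/g seen earlier in the thread):

```shell
#!/bin/bash
# An unquoted '&' splits the command line: the shell backgrounds
# everything before it and treats the rest as a new command.
# Escaping the '&' keeps the URL in one piece:
url='http://example.com/?a=1&b=2'
escaped=$(printf '%s\n' "$url" | sed 's/&/\\\&/g')
echo "$escaped"
# -> http://example.com/?a=1\&b=2
```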