Hi All,
 
I have a list of text files, some contain url's and some don't.  I'm trying
to extract the url's and print only the url to a new file w/ the same name
in a new directory.   I think I'm close, but the new files are empty..   If
someone could please take a look and let me know what I'm doing wrong I sure
would appreciate it..  Thank you!   
 
Oh yeah, I'm also getting a 
 
# ------  start
 

#!/usr/bin/perl
 
use warnings;
use strict;
use Cwd;
use File::Basename;
use Regexp::Common qw /URI/;
 
my $dir = "C:/brian/small";
 opendir (SM, $dir) or die "Can't open $dir: $!";
 
my @files = map { "$dir/$_" } grep { !/^\./ } readdir SM;
 
close SM;
 
foreach my $file (@files) {
    my ($basename) = fileparse($file,'.txt');
       
       open(LINK, "> $basename.txt") or warn "$!\n";
       my $text = '';
       open(TEXT, "< $file") or warn "$!\n";
       read( TEXT, $text, -s TEXT );       
       print LINK "$text $1\n"                                  # <------
Use of uninitialized value in pattern match 
       and close TEXT and next
       if /$RE{URI}{HTTP}{-keep}/;
        
    }

# ----- end 
 
Brian Volk
HP Products
317.298.9950 x1245
 <mailto:[EMAIL PROTECTED]> [EMAIL PROTECTED]
 
 

Reply via email to