> Hi All, 
>  
> I have a directory full of .html documents (2000 of them), named:
>  
> 12345.html
> 11111.html
> 77548.html
> 45451.html
> 12132.html
> and so on....
>  
> I have another directory of files with item numbers that match the .html
> documents:
> file_1.txt    file_2.txt    file_3.txt
> 11111         12132         45451
> 77548         77548         11111
> 12132                       77548
> 77548
>  
> I want to read the first item # in file_1.txt (11111), select 11111.html
> from the .html dir, and send it as an email attachment.  I have been
> reading up on hashes but I'm not sure if that is what I should be
> using...  Maybe File::Find, not sure...  so I went with @ARGV for the
> .txt files and foreach for the .html's.  Below is what I have put
> together so far...  I really need help with the while (<>) and the
> foreach...  :~)  I think I have the email piece working OK...  Any
> advice would be greatly appreciated.  Thanks!
>  
>  
> --- begin
> #!/usr/bin/perl
>  
> use strict;
> use warnings;
> use MIME::Lite;
> use Net::SMTP;
>  
> # email information 
> my $from_address = '[EMAIL PROTECTED]';
> my $to_address   = '[EMAIL PROTECTED]';
> my $mail_host    = 'smtp.bvolk.com';
> my $subject      = $ARGV;    # note: $ARGV is only set while reading files via <>
> my $message_body = "Attached is the msds for $ARGV\n ...";
> my $html_file_zip  = 'c:\brian\test\orders';
> my $email_file_zip = '$file';    # single quotes: this is the literal string '$file'
>  
> # directory of .html doc's
> my $html_dir = "C:/brian/test/html";
> opendir (HTML, $html_dir) or die "Can't open $html_dir: $!";
>
> my @files = map { "$html_dir/$_" } grep { !/^\./ } readdir HTML;
>
> closedir HTML;
>  
> # directory of .txt doc's
> my $orders_dir = "C:/brian/test/orders";
> opendir (ORDERS, $orders_dir) or die "Can't open $orders_dir: $!";
>  
> # load @ARGV for <> operator
> @ARGV = map { "$orders_dir/$_" } grep { !/^\./ } readdir ORDERS;
>  
> foreach my $file (@files) {
>
>     while (<>) {                     # note: <> is used up on the first pass through the foreach
>         if ( $ARGV =~ $file ) {
>             my $msg = MIME::Lite->new (
>                 From    => $from_address,
>                 To      => $to_address,
>                 Subject => $subject,
>                 Type    => 'multipart/mixed'
>             ) or die "Error creating multipart container: $!\n";
>
>             $msg->attach (
>                 Type => 'TEXT',
>                 Data => $message_body
>             ) or die "Error adding the text message part: $!\n";
>
>             $msg->attach (
>                 Type        => 'application/zip',
>                 Path        => $html_file_zip,
>                 Filename    => $email_file_zip,
>                 Disposition => 'attachment'
>             ) or die "Error adding $email_file_zip: $!\n";
>
>             MIME::Lite->send('smtp', $mail_host, Timeout=>60);
>             $msg->send;
>         }
>     }
> }
>
> closedir (ORDERS);
>  
> ---end
>  
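A note on the while (<>) piece above: once @ARGV has been loaded with the .txt file names, the <> operator simply reads those files one line at a time, in order, and $ARGV always holds the name of the file currently being read.  A minimal sketch of just that part (untested; the directory path and the grep are taken from the quoted code):

use strict;
use warnings;

# fill @ARGV with the order files, as in the quoted code
my $orders_dir = "C:/brian/test/orders";
opendir (ORDERS, $orders_dir) or die "Can't open $orders_dir: $!";
@ARGV = map { "$orders_dir/$_" } grep { !/^\./ } readdir ORDERS;
closedir ORDERS;

while (<>) {                 # file_1.txt, file_2.txt, ... one line at a time
    chomp;                   # $_ is one item number, e.g. "11111"
    print "$ARGV : $_\n";    # $ARGV is the .txt file being read right now
}

That also shows why nesting while (<>) inside the foreach doesn't quite work: <> is used up after the first pass, so later iterations of the foreach read nothing.
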

I think a hash is what I need to be using, but I'm not sure how to load the
keys and the values.  I'm sure I want the keys to be the files in the .html
directory and the values to be the item numbers in all the .txt files.  This
way the hash can function as a lookup table.  Right?

Could someone please point me in the right direction?

# directory of .html doc's
my $html_dir = "C:/brian/test/html";
opendir (HTML, $html_dir) or die "Can't open $html_dir: $!";

my @htmls = map { "$html_dir/$_" } grep { !/^\./ } readdir HTML;
closedir HTML;

# directory of .txt doc's
my $orders_dir = "C:/brian/test/orders";
opendir (ORDERS, $orders_dir) or die "Can't open $orders_dir: $!";

# load @ARGV for the <> operator
@ARGV = map { "$orders_dir/$_" } grep { !/^\./ } readdir ORDERS;
closedir ORDERS;

# ----- not sure how to get the above values into the hash -----
my %hash = ( );
my @html_keys    = keys %hash;      # renamed: a second "my @htmls" here would mask the one above
my @item_numbers = values %hash;
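
One way to load it (a sketch, untested; it assumes every file really is named <item number>.html, as in the listing at the top, and %html_for is just an illustrative name): key the hash on the item number and store the full path as the value.  That's the reverse of keys-as-files / values-as-numbers, but it makes the later lookup a single hash access:

# build the lookup table: item number => full path of the .html file
my %html_for;
for my $path (@htmls) {
    if ( $path =~ m{/(\d+)\.html$}i ) {    # e.g. .../11111.html
        $html_for{$1} = $path;             # "11111" => "C:/brian/test/html/11111.html"
    }
}

Afterwards, exists $html_for{'11111'} says whether there is a page for that item, and $html_for{'11111'} is the full path to attach.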

And then I realize I'll need to create a subroutine that will attach the
.html file that matches the item number to an email.
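
With that hash in place, the mail step could be driven straight from the while (<>) loop: read each item number from the .txt files, look it up, and hand the result to a small subroutine that builds and sends the message.  A sketch, untested; send_msds is only an illustrative name, $from_address / $to_address / $mail_host and the MIME::Lite calls are reused from the quoted code, and it attaches the .html file itself rather than a zip:

while (<>) {                      # @ARGV already holds the .txt files
    chomp( my $item = $_ );
    $item =~ s/\s+//g;            # drop stray whitespace around the number
    next unless length $item;

    my $html = $html_for{$item};
    unless ($html) {
        warn "no .html file for item $item (listed in $ARGV)\n";
        next;
    }
    send_msds( $item, $html );
}

# one message per matching item number
sub send_msds {
    my ( $item, $path ) = @_;

    my $msg = MIME::Lite->new(
        From    => $from_address,
        To      => $to_address,
        Subject => "msds for item $item",
        Type    => 'multipart/mixed',
    ) or die "Error creating multipart container: $!\n";

    $msg->attach(
        Type => 'TEXT',
        Data => "Attached is the msds for $item\n",
    ) or die "Error adding the text message part: $!\n";

    $msg->attach(
        Type        => 'text/html',
        Path        => $path,
        Filename    => "$item.html",
        Disposition => 'attachment',
    ) or die "Error adding $path: $!\n";

    MIME::Lite->send( 'smtp', $mail_host, Timeout => 60 );
    $msg->send;
}

Note that file_1.txt lists 77548 twice, so as written that item would be mailed twice; a %seen hash could skip repeats, or the matches could be collected per $ARGV and sent as one message per .txt file.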

TIA!

Brian Volk 
