Hello R people,

I have created a '.csv' file of 100 rows by 20 columns, in which each cell 
contains a random number between 0 and 1, via a Java program. Once the file 
is created, a signal (a single letter) is sent over a socket connection to 
"localhost", where an R session started earlier is listening. R then reads 
the '.csv' file into a data frame and computes the average of the 2000 
numbers. The mean is written back to the socket connection, and Java 
receives it successfully. This processing in the R session took 
1 min 03 sec.
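The R side of the steps above could be sketched roughly as follows. This is a minimal sketch under my own assumptions; the port number (6011), the file name ("random.csv"), and the treatment of the signal are placeholders, since the post does not give them:

```r
## Minimal sketch of the R session (port, file name are assumed, not from the post)
con <- socketConnection(host = "localhost", port = 6011,
                        server = TRUE, blocking = TRUE, open = "r+")

signal <- readLines(con, n = 1)                  # wait for the one-letter signal from Java
if (length(signal) > 0) {
  dat <- read.csv("random.csv", header = FALSE)  # 100 rows x 20 columns
  m   <- mean(as.matrix(dat))                    # average of all 2000 numbers
  writeLines(as.character(m), con)               # send the mean back to Java
}
close(con)
```

The call `readLines(con, n = 1)` asks for exactly one line, so it returns as soon as the signal arrives rather than waiting for end-of-file.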
My question: is a duration of 63 seconds reasonable for this set of 
activities, or should it be less? If so, how can it be reduced? Note that 
both Java and R were running on the same PC, and 62 of those seconds were 
consumed by the "readLines" command.

Thanks and regards.
Aniruddha.




______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
